812 research outputs found

    Genetics and the general physician: insights, applications and future challenges

    Get PDF
    Scientific and technological advances in our understanding of the nature and consequences of human genetic variation are now allowing genetic determinants of susceptibility to common multifactorial diseases, as well as of our individual response to therapy, to be defined. I review how genome-wide association studies are robustly identifying new disease susceptibility loci, providing insights into disease pathogenesis and potential targets for drug therapy. Some of the remarkable advances being made using current genetic approaches in Crohn's disease, coronary artery disease and atrial fibrillation are described, together with examples from malaria, HIV/AIDS, asthma, prostate cancer and venous thrombosis which illustrate important principles underpinning this field of research. The limitations of current approaches are also noted, highlighting how much of the genetic risk remains unexplained and how difficult it remains to resolve specific functional variants. There is a need to more clearly understand the significance of rare variants and structural genomic variation in common disease, as well as epigenetic mechanisms. Specific examples from pharmacogenomics are described, including warfarin dosage and prediction of abacavir hypersensitivity, which illustrate how in some cases such knowledge is already impacting on clinical practice, while in others prospective evaluation of clinical utility and cost-effectiveness is required to define opportunities for personalized medicine. There is also a need for a broader debate about the ethical implications of current advances in genetics for medicine and society.
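    As a purely illustrative aside (not drawn from the review itself), one common way GWAS-identified susceptibility loci are combined into an individual estimate of genetic risk is an additive polygenic score; the SNP identifiers, effect sizes and genotype below are hypothetical.

```python
# Illustrative additive polygenic risk score (hypothetical loci and effect sizes).
# Each locus contributes (number of risk alleles) x (log odds ratio from a GWAS).

# Hypothetical per-SNP effect sizes (log odds ratios) -- not real data.
effect_sizes = {
    "rs0000001": 0.12,
    "rs0000002": 0.05,
    "rs0000003": 0.30,
}

# Hypothetical genotype: count of risk alleles (0, 1 or 2) at each locus.
genotype = {
    "rs0000001": 2,
    "rs0000002": 0,
    "rs0000003": 1,
}

def polygenic_score(effects, alleles):
    """Sum of risk-allele counts weighted by per-locus effect sizes."""
    return sum(effects[snp] * alleles.get(snp, 0) for snp in effects)

print(f"Polygenic risk score: {polygenic_score(effect_sizes, genotype):.3f}")
```

    That such additive scores typically capture only a modest share of trait variability is one face of the "unexplained genetic risk" the review highlights.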

    Analysis of some algorithms for use on paged virtual memory computers

    Get PDF
    PhD Thesis. Handling a single page fault involves the execution of thousands of instructions and drum rotational delay, and is usually so expensive that if it can be avoided, almost any other cost can be tolerated. Optimizing operating system performance is usually the main concern of computer scientists who deal with paged memories. However, redesigning the algorithm used by a problem program can often result in a very significant reduction in paging, and hence in program execution time. The redesigned algorithm frequently does not satisfy the more conventional efficiency criteria. A sorting algorithm, hash coding and other search algorithms are considered. Analytic and simulation studies are presented, and some modifications are proposed to reduce the number of page faults produced by data set references. Analysis is in terms of three of the most commonly used page replacement algorithms, i.e. least recently used, first in first out, and random selection. The modifications are for the most part relatively minor and in some cases have appeared elsewhere in the context of searching on external storage media. The important aspects are the dramatic performance improvements which are possible, and the fact that classical internal algorithms are inappropriate for use in a paged virtual memory system. The Science Research Council; The University of Newcastle Upon Tyne; International Business Machines (United Kingdom) Limited.
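    As a sketch of the kind of analysis involved (illustrative only, not code from the thesis), the following counts the page faults a reference string incurs under a least-recently-used replacement policy with a fixed number of page frames.

```python
from collections import OrderedDict

def lru_page_faults(reference_string, num_frames):
    """Count page faults for a reference string under LRU replacement.

    reference_string: iterable of page numbers touched by the program.
    num_frames: number of physical page frames available.
    """
    frames = OrderedDict()  # keys are resident pages, ordered by recency of use
    faults = 0
    for page in reference_string:
        if page in frames:
            frames.move_to_end(page)       # hit: mark as most recently used
        else:
            faults += 1                    # miss: page fault
            if len(frames) >= num_frames:
                frames.popitem(last=False) # evict the least recently used page
            frames[page] = True
    return faults

# Example: a program that sweeps cyclically over more pages than it has frames
# faults on every reference under LRU -- one reason access patterns matter so much.
refs = [0, 1, 2, 3, 0, 1, 2, 3, 0, 1, 2, 3]
print(lru_page_faults(refs, num_frames=3))  # 12 faults for 12 references
```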

    2W/nm Peak-power All-Fiber Supercontinuum Source and its Application to the Characterization of Periodically Poled Nonlinear Crystals

    Full text link
    We demonstrate an all-fiber supercontinuum source with uniform, high spectral brightness and peak power density. The source consists of a nanosecond ytterbium fiber laser and an optimal-length PCF, producing a continuum with a peak power density of 2 W/nm and less than 5 dB of spectral variation between 590 and 1500 nm. The watt-level per-nm peak power density enables the use of such sources for the characterization of nonlinear materials. Application of the source is demonstrated with the characterization of several periodically poled crystals. Comment: 8 pages, 4 figures; v2 includes revisions to the description of the continuum formation

    Optimal Sizes of Dielectric Microspheres for Cavity QED with Strong Coupling

    Get PDF
    The whispering gallery modes (WGMs) of quartz microspheres are investigated for the purpose of strong coupling between single photons and atoms in cavity quantum electrodynamics (cavity QED). Within our current understanding of the loss mechanisms of the WGMs, the saturation photon number, n, and critical atom number, N, cannot be minimized simultaneously, so an "optimal" sphere size is taken to be the radius for which the geometric mean, (n x N)^(1/2), is minimized. While a general treatment is given for the dimensionless parameters used to characterize the atom-cavity system, detailed consideration is given to the D2 transition in atomic Cesium (852 nm) using fused-silica microspheres, for which the maximum coupling coefficient g/(2*pi) = 750 MHz occurs for a sphere radius a = 3.63 microns, corresponding to the minimum n = 6.06x10^(-6). By contrast, the minimum N = 9.00x10^(-6) occurs for a sphere radius a = 8.12 microns, while the optimal sphere size, for which (n x N)^(1/2) is minimized, occurs at a = 7.83 microns. On the experimental front, we have fabricated fused-silica microspheres with radii a = 10 microns and consistently observed quality factors Q = 0.8x10^7. These results for the WGMs are compared with corresponding parameters achieved in Fabry-Perot cavities to demonstrate the significant potential of microspheres as a tool for cavity QED with strong coupling. Comment: 12 pages, 14 figures
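    A brief aside for context (these are the conventional cavity-QED definitions, stated here on the assumption that the paper follows standard notation, with g the atom-field coupling rate, kappa the cavity field decay rate and gamma_perp the atomic dipole decay rate): both figures of merit fall as g grows, but only N depends on kappa, which is why their minima occur at different sphere radii.

```latex
% Conventional cavity-QED figures of merit (standard definitions; assumed, not
% quoted from the paper).  g: atom-field coupling rate, \kappa: cavity field
% decay rate, \gamma_\perp: atomic dipole decay rate.
\begin{align}
  n &= \frac{\gamma_\perp^{2}}{2g^{2}}
      && \text{saturation photon number}\\
  N &= \frac{2\kappa\gamma_\perp}{g^{2}}
      && \text{critical atom number}\\
  \sqrt{nN} &= \frac{\gamma_\perp\sqrt{\kappa\gamma_\perp}}{g^{2}}
      && \text{geometric-mean figure of merit minimized above}
\end{align}
```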

    Time-dependent degradation of photonic crystal fiber attenuation around OH absorption wavelengths

    Get PDF

    Dependability through Assured Reconfiguration in Embedded System Software

    Full text link

    1064 nm laser-induced defects in pure SiO₂ fibers

    Get PDF

    Highly birefringent 98-core fiber

    Get PDF

    Validation of Ultrahigh Dependability for Software-Based Systems

    Get PDF
    Modern society depends on computers for a number of critical tasks in which failure can have very high costs. As a consequence, high levels of dependability (reliability, safety, etc.) are required from such computers, including their software. Whenever a quantitative approach to risk is adopted, these requirements must be stated in quantitative terms, and a rigorous demonstration that they have been attained is necessary. For software used in the most critical roles, such demonstrations are not usually supplied. The dependability requirements often lie near the limit of the current state of the art, or beyond, in terms not only of the ability to satisfy them but also, and more often, of the ability to demonstrate that they are satisfied in the individual operational products (validation). We discuss reasons why such demonstrations cannot usually be provided with the means available: reliability growth models, testing with stable reliability, structural dependability modelling, and more informal arguments based on good engineering practice. We state some rigorous arguments about the limits of what can be validated with each of these means. Combining evidence from these different sources would seem to raise the levels that can be validated, yet the improvement is not enough to solve the problem. It appears that engineering practice must take into account the fact that no solution exists, at present, for the validation of ultra-high dependability in systems relying on complex software.
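    To make the scale of the validation problem concrete (an illustrative calculation using the standard constant-failure-rate model, not a result quoted from the paper): under an exponential failure model, demonstrating a failure rate below lambda with confidence 1 - alpha by failure-free testing requires roughly t >= -ln(alpha)/lambda hours of representative, failure-free operation.

```python
import math

def required_failure_free_hours(target_rate_per_hour, confidence):
    """Failure-free test time needed to claim the failure rate is below the target.

    Assumes a constant failure rate (exponential model): observing zero failures
    in time t gives confidence 1 - exp(-rate * t) that the true rate is below
    target_rate_per_hour, so we solve exp(-rate * t) <= 1 - confidence for t.
    """
    return -math.log(1.0 - confidence) / target_rate_per_hour

# A typical ultra-high dependability target: 1e-9 failures per hour at 99% confidence.
hours = required_failure_free_hours(1e-9, 0.99)
print(f"{hours:.3g} hours = about {hours / (24 * 365):.0f} years of failure-free testing")
# ~4.6e9 hours, i.e. on the order of half a million years -- far beyond what testing can supply.
```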